
How to enable Logging from Boomi Integration solutions to Nodinite

Learn how to perform Asynchronous Logging from Boomi Integration to Nodinite.

You must add custom logging code to your Boomi Integration components to enable logging of business transactions to Nodinite.

graph LR
  subgraph "Boomi Integration Component"
    roBroker(fal:fa-brackets-curly Reusable generic logging component) --> roSink(fal:fa-file-export Intermediate storage)
  end
  subgraph "Nodinite instance"
    roPS(fal:fa-truck-pickup Pickup Service) --> roNI(fal:fa-cloud-download Log API)
    roSink --> roPS
  end

Enable self-service Log Views for your business by implementing the following:
- Payload logging
- Context logging (key-value)
- Process execution logging from selected components (any number)


Step by step guide

Follow the steps outlined below.

In your Boomi Integration components, make sure to add custom logging (regardless of the target). Typically this is done using the Reusable generic logging component or a similar framework, depending on your design, policy, and needs.

TIP: Make sure your Logging strategy is not vendor-specific. Your Logging logic should be easy to enable/disable and it should be possible to manage the log destinations without redesigning the solutions already in production.

graph LR
  subgraph "Systems Integrations Solutions"
    roStrategy(fal:fa-lightbulb-on Logging Strategy) --> roBroker{fal:fa-brackets-curly Reusable Generic Logging logic}
    roBroker --> |Nodinite JSON Log Event| roSink(fal:fa-file-export Intermediate storage)
    roBroker -.-> |"The other format(s)"| ro2["Other Log destination(s)"]
  end
  subgraph "Nodinite instance"
    roPS(fal:fa-truck-pickup Pickup Service) --> roNI(fal:fa-cloud-download Log API)
    roSink --> roPS
  end

Logging Components
Generic logging in Boomi component

1. Setup the Log properties structure

From selected components, you will eventually perform Logging. Logging is individual to each process, so you need a commonly defined structure to apply when Logging.

To create this structure, follow the steps below:

Add new folder structure

  1. Navigate to the Build tab
  2. In the main tree, either re-use an existing folder structure or create a new one for the reusable properties shape
  3. If you are not re-using an existing folder structure, create a new folder

Then, click on the New button.
New button

This opens the Create Component modal.
Create Component

  1. Select Process Property
  2. Name the generic log structure (e.g. LogProperties)
  3. Select the destination folder
  4. Click the Create button

Next, you must add the common set of properties required as part of your Logging strategy. Open the newly created LogProperties component.

Open LogProperties component

In the Edit Properties page, add the set of properties your logging strategy requires.

Set default Properties
List of common properties used for Nodinite Logging

Nodinite Logging requires the following properties. You can add other properties of interest either to the existing definition of a Nodinite JSON Log Event or to the generic key-value collection (Context).

All properties below are mandatory.

| Data Type | Field | Example Value | Comment | Boomi specific |
| --- | --- | --- | --- | --- |
| number | LogAgentValueId | 42 | Who (Log Agents) sent the data | From the Groovy script |
| string | EndPointName | "INT101: Receive Hello World Log Events" | Name of Endpoint transport | Set in the flow component where the name of the endpoint is known |
| string | EndPointUri | "C:\DropArea\in" | URI for Endpoint transport | Set in the flow component where the Uri is known |
| number | EndPointDirection | 0 | Direction for Endpoint transport | Define the property with a pre-defined set of allowed values (Endpoint directions) |
| number | EndPointTypeId | 0 | Type of Endpoint transport | Define as number, and set as defined by the Nodinite standard (EndpointTypeIds) |
| string | OriginalMessageTypeName | "https://nodinite.com/Customers/1.0#Batch" | Message Type Name | Set in the flow component where the message type is known |
| string | LogDateTime | "2018-05-03T13:37:00.123Z" | Client Log datetime (UTC format) | Set by the generic reusable map described later |

Some of these properties, including the optional payload, are set in the mapping stage that comes later in this guide.
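Put together, a Nodinite JSON Log Event built from these properties could look like the following sketch. The values are examples from the table above; Body carries the base64-encoded payload and Context holds the optional key-value pairs (see the Nodinite JSON Log Event definition for the full schema):

```json
{
  "LogAgentValueId": 42,
  "EndPointName": "INT101: Receive Hello World Log Events",
  "EndPointUri": "C:\\DropArea\\in",
  "EndPointDirection": 0,
  "EndPointTypeId": 0,
  "OriginalMessageTypeName": "https://nodinite.com/Customers/1.0#Batch",
  "LogDateTime": "2018-05-03T13:37:00.123Z",
  "Body": "SGVsbG8gV29ybGQh",
  "Context": {
    "CorrelationId": "064205E2-F7CF-43A6-B514-4B55536C2B67"
  }
}
```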

  1. From the Build section; Add a Set Properties shape to your workflow (component).
    Add a Set Properties shape
  2. Click the Configure button.
    Configure Set Properties Shape button
  3. Add the Generic reusable logging component to your workflow (component).

    This component is detailed in the next step

2. Convert the logged event to Nodinite format

Next, you must create the Generic reusable logging component. In this guide, we help you create it and write a Nodinite JSON Log Event to the local file system as the intermediate storage.

Reusable Logging Component

  1. In your folder structure, create a new Component and name it Reusable Generic Logging Component

  2. Set the Start shape properties

    1. Name it "Logger"
    2. Set the Type to Data Passthrough
  3. Add a Data Process shape to convert the payload to a base64-encoded string
    Convert to base64

    1. Name it "Base64Encode payload"
    2. Set the Process Type to Custom Scripting
    3. Set the Script Source to Inline Scripting
    4. Select Groovy 1.5
    5. Set the Script to:
    // Base64-encode each document so the payload can be embedded in the
    // Body field of the Nodinite JSON Log Event.
    import java.util.Properties;
    import java.io.InputStream;
    import java.io.ByteArrayInputStream;
    
    for( int i = 0; i < dataContext.getDataCount(); i++ ) {
        InputStream is = dataContext.getStream(i);
        Properties props = dataContext.getProperties(i);
        // Read the payload as UTF-8 text and Base64-encode it
        def inputDataString = is.getText("UTF-8").bytes.encodeBase64().toString();
        // Replace the document stream with the encoded payload
        dataContext.storeStream(new ByteArrayInputStream(inputDataString.getBytes()), props);
    }
    
  4. In sequence; Add a Map Shape
    Map Shape

  5. Edit the Map

    1. Set the Body using the Data Process Shape from step 3
    2. Set the other properties using the Log Properties record (uniquely populated with data from the component designer for selected flows)
    Transformation
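The two transformations this component performs, base64-encoding the payload and stamping a UTC LogDateTime, can be sketched outside Boomi. The Java class below is illustrative only; LogEventSketch and its method names are not part of any Boomi or Nodinite API:

```java
import java.nio.charset.StandardCharsets;
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.Base64;

public class LogEventSketch {
    // Pattern matching the LogDateTime example: 2018-05-03T13:37:00.123Z
    static final DateTimeFormatter UTC =
        DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").withZone(ZoneOffset.UTC);

    // Same transformation as the Groovy Data Process shape
    static String encodeBody(String payload) {
        return Base64.getEncoder().encodeToString(payload.getBytes(StandardCharsets.UTF_8));
    }

    // Client Log datetime in the UTC format Nodinite expects
    static String logDateTime(Instant now) {
        return UTC.format(now);
    }

    public static void main(String[] args) {
        System.out.println(encodeBody("Hello World!"));
        System.out.println(logDateTime(Instant.now()));
    }
}
```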

To uniquely identify the Log Agent, you must provide a LogAgentValueId. To separate Logging from the different environments (e.g. Prod, Test, QA), it is best to make the LogAgentValueId (a mandatory number) a configurable part of the Boomi Integration solution.
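A minimal sketch of resolving LogAgentValueId from configuration with a fallback value; the NODINITE_LOG_AGENT_VALUE_ID variable name is an assumption for illustration, and inside Boomi you would normally use a process property or environment extension instead:

```java
public class LogAgentConfig {
    // Parse a configured LogAgentValueId, falling back when the value
    // is missing or not a number (the field is a mandatory number).
    static int logAgentValueId(String raw, int fallback) {
        try {
            return raw == null ? fallback : Integer.parseInt(raw.trim());
        } catch (NumberFormatException e) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        // NODINITE_LOG_AGENT_VALUE_ID is a hypothetical variable name
        String raw = System.getenv("NODINITE_LOG_AGENT_VALUE_ID");
        System.out.println("LogAgentValueId=" + logAgentValueId(raw, 42));
    }
}
```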

Make sure to get the best out of the information you have. Map existing properties to the standard properties and attach context and payload as needed.

3. Send Event to intermediate storage

Your intermediate storage should be close to the Boomi Integration installation (i.e. local) and highly available.

Send the Nodinite JSON Log Event to the intermediate storage

  1. In the sequence, add a Connector Shape
  2. Set the Display Name
  3. Select the Connector - in this case, Disk
  4. Select Send as the Action
  5. Name the Connection and set the destination path (e.g. C:\temp\boomi\Log Files)
  6. Name the Operation and set the New Disk Connector Operation properties
    Disk Connector
Do not log events synchronously to the Nodinite Log API. You should honour the asynchronous pattern by using the Nodinite Pickup Service.

Asynchronous Logging
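Outside Boomi, the drop-to-disk step can be sketched as follows; the folder path and filename convention are assumptions for illustration (the Disk connector does this for you inside Boomi). Writing to a temporary name first and then moving the file into place means a reader never sees a half-written event:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.UUID;

public class DropLogEvent {
    // Write the event under a temporary name, then move it into place
    // so the pickup process never reads a partially written file.
    static Path drop(Path folder, String jsonLogEvent) {
        try {
            Files.createDirectories(folder);
            Path tmp = folder.resolve(UUID.randomUUID() + ".tmp");
            Files.write(tmp, jsonLogEvent.getBytes(StandardCharsets.UTF_8));
            Path target = folder.resolve(UUID.randomUUID() + ".json");
            return Files.move(tmp, target, StandardCopyOption.ATOMIC_MOVE);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        Path dropped = drop(Path.of("build/pickup"), "{\"LogAgentValueId\":42}");
        System.out.println("Dropped " + dropped);
    }
}
```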

4. Configure the Nodinite Pickup Service

The Nodinite Pickup Service can fetch the logged events from many intermediate storages.

| Source | Description | Recommended Monitoring Agent | External Link | Configuration |
| --- | --- | --- | --- | --- |
| ActiveMQ | Fetch Log Events from ActiveMQ/ActiveMQ Artemis queues | Message Queuing Agent | Apache NMS ActiveMQ | Configuration |
| AnypointMQ | Fetch Log Events from MuleSoft Cloudhub AnypointMQ platform | Message Queuing Agent | AnypointMQ | Configuration |
| Azure Event Hub | Fetch Log Events from EventHub | Azure Monitoring agent | EventHub | Configuration |
| Azure ServiceBus | Fetch Log Events from Azure Service Bus | Message Queuing Agent | Azure Service Bus | Configuration |
| Disk / Folder | Fetch Log Events from file folders and SMB enabled shares | File Monitoring Agent | | Configuration |
| Microsoft MSMQ | Fetch Log Events from Microsoft MSMQ | Message Queuing Agent | | Configuration |
| Microsoft SQL Server | Fetch Log Events from Microsoft SQL Server | Database Monitoring Agent | | Configuration |
| PostgreSQL | Fetch Log Events from PostgreSQL database instances | Database Monitoring Agent | PostgreSQL | Configuration |

Missing a source? Please contact our support, support@nodinite.com, and we will build it for you!

5. Enable self-service from within Nodinite

Now that you have data logged inside Nodinite, you can create Role-based Log Views and invite your business to share insights and data from your logging.

Frequently asked questions

Boomi Integration Logging uses the Nodinite Pickup Service.

Additional solutions to common problems and the FAQ for the Nodinite Pickup Service exist in the Pickup Service Troubleshooting user guide.

How do I get started?

First, read our Asynchronous Logging user guide.

How do I access the Logged data?

Access to logged data for Users is provided through Log Views with Role-based security.


Next Step

Install Nodinite Pickup Service

Log Views
Asynchronous Logging